| Name | Version | Summary | Date |
|------|---------|---------|------|
| batchee | 1.5.2 | Determine how to group together input files into batches for subsequent concatenation | 2025-09-16 16:38:25 |
| llm_batch_helper | 0.3.3 | A Python package that enables batch submission of prompts to LLM APIs, with a simplified interface and built-in async capabilities handled implicitly. | 2025-09-12 05:03:48 |
| ChemInformant | 2.4.2 | A robust and high-throughput Python client for the PubChem API, designed for automated data retrieval and analysis | 2025-09-10 06:25:23 |
| bipl | 0.6.6 | Openslide/libtiff/GDAL ndarray-like interface and lazy parallel tile-based processing | 2025-09-04 23:36:46 |
| beque | 0.1.0 | Asynchronous in-memory batch queue processor | 2025-08-30 03:45:06 |
| speechmatics-batch | 0.4.1 | Speechmatics Batch API client | 2025-08-29 07:26:04 |
| nimble-llm-caller | 0.2.2 | A robust, multi-model LLM calling package with intelligent context management, file processing, and advanced prompt handling | 2025-08-22 05:40:15 |
| batchata | 0.4.7 | Unified Python API for AI batch requests with 50% cost savings on OpenAI and Anthropic | 2025-08-20 02:06:30 |
| git-batch | 4.0.12 | Clone a single branch from all repositories listed in a file. | 2025-08-17 18:55:44 |
| universal-printer | 3.0.0 | Cross-platform document printing with enhanced PDF generation | 2025-08-09 12:04:01 |
| openai-so-batch | 0.1.1 | Python library for creating and managing OpenAI Structured Outputs batch API calls | 2025-08-04 17:53:55 |
| azllm | 0.1.6 | A Python package that provides an easier user interface for multiple LLM providers. | 2025-07-31 01:06:46 |
| opsqueue | 0.30.2 | Python client library for Opsqueue, the lightweight batch processing queue for heavy loads | 2025-07-30 08:19:58 |
| m9ini | 1.0.4 | m9 ini configuration | 2025-07-28 21:20:52 |
| django-routines | 1.6.1 | Define named groups of management commands in Django settings files for batched execution. | 2025-07-25 20:47:00 |
| openai-batch | 0.3.2 | Make OpenAI batch easy to use. | 2025-07-23 18:25:16 |
| enhanced-chinese-translator | 1.0.1 | High-performance Chinese to English translation tool with multi-threading and batch processing | 2025-07-23 07:16:54 |
| model-ensembler | 0.6.2 | Model Ensembler for managed batch workflows | 2025-07-22 15:31:57 |
| m9lib | 1.0.1 | m9 utility library | 2025-07-20 17:49:29 |
| llm-batch-helper | 0.1.2 | A Python package that enables batch submission of prompts to LLM APIs, with built-in async capabilities and response caching. | 2025-07-16 17:35:51 |